Graphical Newton
Authors
Abstract
Computing the Newton step for a generic function f : R^N → R takes O(N^3) flops. In this paper, we explore avenues for reducing this bound, when the computational structure of f is known beforehand. It is shown that the Newton step can be computed in time, linear in the size of the computational-graph, and cubic in its tree-width.
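As a minimal illustration of the complexity claim (not the paper's algorithm), consider a function whose computational graph is a chain, so its Hessian is tridiagonal (tree-width 1): the Newton step then costs O(N) via a banded solve rather than the generic O(N^3) dense solve. The objective below is a hypothetical example chosen for its chain structure.

```python
import numpy as np
from scipy.linalg import solveh_banded

# Hypothetical chain-structured objective:
#   f(x) = sum_i (x[i] - x[i+1])^2 + sum_i x[i]^4
# Each term couples at most two adjacent variables, so the Hessian is
# tridiagonal and the Newton step reduces to an O(N) banded solve.

def gradient(x):
    g = 4 * x**3                 # from the quartic terms
    d = x[:-1] - x[1:]
    g[:-1] += 2 * d              # from the pairwise terms
    g[1:] -= 2 * d
    return g

def hessian_banded(x):
    # Symmetric banded (upper) storage for solveh_banded:
    # row 0 holds the superdiagonal, row 1 the main diagonal.
    n = x.size
    ab = np.zeros((2, n))
    ab[1] = 12 * x**2 + 4.0      # interior diagonal: 12 x_i^2 + 2 + 2
    ab[1, 0] -= 2.0              # endpoints touch only one pairwise term
    ab[1, -1] -= 2.0
    ab[0, 1:] = -2.0             # off-diagonal entries
    return ab

x = np.full(50, 1.5)
for _ in range(20):
    # Newton step via an O(N) banded Cholesky solve
    step = solveh_banded(hessian_banded(x), -gradient(x))
    x = x + step
```

After the loop, the gradient is driven to (numerically) zero; the same iteration with `numpy.linalg.solve` on the dense Hessian would cost O(N^3) per step.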
Similar resources
Graphical Model Structure Learning with ℓ1-Regularization
This work looks at fitting probabilistic graphical models to data when the structure is not known. The main tool to do this is ℓ1-regularization and the more general group ℓ1-regularization. We describe limited-memory quasi-Newton methods to solve optimization problems with these types of regularizers, and we examine learning directed acyclic graphical models with ℓ1-regularization, learning un...
Generalized Newton's Method Based on Graphical Derivatives
This paper concerns developing a numerical method of the Newton type to solve systems of nonlinear equations described by nonsmooth continuous functions. We propose and justify a new generalized Newton algorithm based on graphical derivatives, which have never been used to derive a Newton-type method for solving nonsmooth equations. Based on advanced techniques of variational analysis and gener...
Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
We propose a Newton-CG primal proximal point algorithm for solving large scale log-determinant optimization problems. Our algorithm employs the essential ideas of the proximal point algorithm, the Newton method and the preconditioned conjugate gradient solver. When applying the Newton method to solve the inner sub-problem, we find that the log-determinant term plays the role of a smoothing term...
Improving the Dynamics of Steffensen-type Methods
The dynamics of Steffensen-type methods, using a graphical tool for showing the basins of attraction, is presented. The study includes as particular cases Steffensen-type modifications of the Newton, the two-step, the Chebyshev, the Halley and the super-Halley iterative methods. The goal is to show that if we are interested in preserving the convergence properties, we must ensure that the deriva...
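For context, a minimal sketch of the classical Steffensen iteration that these variants modify: it is derivative-free, replacing f'(x) in Newton's method with the divided difference (f(x + f(x)) - f(x)) / f(x). The test equation below is a hypothetical example.

```python
def steffensen(f, x0, tol=1e-12, max_iter=100):
    """Classical Steffensen iteration: derivative-free and quadratically
    convergent near a simple root. Newton's f'(x) is replaced by the
    divided difference (f(x + f(x)) - f(x)) / f(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        denom = f(x + fx) - fx   # approximates f'(x) * f(x)
        x = x - fx * fx / denom
    return x

root = steffensen(lambda t: t**2 - 2.0, 1.0)
print(root)  # converges to sqrt(2) ≈ 1.41421356
```

Like Newton's method, the basin of attraction depends strongly on the starting point, which is what the graphical study in the paper above examines.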
Linear-Time Algorithm for Learning Large-Scale Sparse Graphical Models
The sparse inverse covariance estimation problem is commonly solved using an ℓ1-regularized Gaussian maximum likelihood estimator known as "graphical lasso", but its computational cost becomes prohibitive for large data sets. A recent line of results showed, under mild assumptions, that the graphical lasso estimator can be retrieved by soft-thresholding the sample covariance matrix and solving a m...
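The soft-thresholding step mentioned in this summary can be sketched as follows; the data, threshold value, and helper name are hypothetical, and this is only the first stage of the approach the (truncated) abstract describes, not the full estimator.

```python
import numpy as np

def soft_threshold_offdiag(S, lam):
    """Elementwise soft-thresholding of the off-diagonal entries of a
    sample covariance matrix S; the diagonal is left intact."""
    T = np.sign(S) * np.maximum(np.abs(S) - lam, 0.0)
    np.fill_diagonal(T, np.diag(S))
    return T

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # 200 samples, 5 variables
S = np.cov(X, rowvar=False)            # sample covariance
T = soft_threshold_offdiag(S, lam=0.1) # sparsified covariance
```

Small off-diagonal entries are zeroed, producing the sparsity pattern that the subsequent solve exploits.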
Journal title:
- CoRR
Volume: abs/1508.00952
Pages: -
Publication date: 2015